Variational Fair Clustering

Authors

Abstract

We propose a general variational framework for fair clustering, which integrates an original Kullback-Leibler (KL) fairness term with a large class of clustering objectives, including prototype-based and graph-based ones. Fundamentally different from the existing combinatorial and spectral solutions, our multi-term variational approach enables control of the trade-off between the fairness and clustering objectives. We derive a tight upper bound based on a concave-convex decomposition of our fairness term, its Lipschitz-gradient property, and Pinsker's inequality. Our bound can be jointly optimized with various clustering objectives, while yielding a scalable solution with a convergence guarantee. Interestingly, at each iteration, it performs an independent update for each assignment variable; therefore, it can be easily distributed for large-scale datasets. This scalability is important as it enables exploring different trade-off levels between the fairness and clustering objectives. Unlike spectral relaxation, our formulation does not require computing an eigenvalue decomposition. We report comprehensive evaluations and comparisons with state-of-the-art methods over various fair-clustering benchmarks, which show that our variational formulation yields highly competitive solutions in terms of fairness and clustering objectives.
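To make the structure of such an objective concrete, the following is a minimal sketch of KL-regularized fair clustering with soft assignments: a K-means-style data term plus a KL divergence between each cluster's demographic mix and the overall group proportions, with each point's assignment updated independently. This is an illustrative simplification, not the authors' implementation: it uses a plain gradient-linearized softmax step in place of the paper's tight concave-convex bound, and the function name and parameters (`fair_kmeans_sketch`, `lam`, etc.) are our own.

```python
import numpy as np

def fair_kmeans_sketch(X, groups, K, lam=1.0, n_iter=50, rng=None):
    """Illustrative sketch (not the paper's exact algorithm).

    Minimizes  sum_n sum_k s_nk * ||x_n - c_k||^2  +  lam * sum_k KL(U || p_k),
    where p_k is the demographic mix of cluster k under the soft
    assignments S, and U is the overall group-proportion vector.
    Each row of S is updated independently of the others.
    """
    rng = np.random.default_rng(rng)
    N, _ = X.shape
    J = groups.max() + 1
    U = np.bincount(groups, minlength=J) / N       # target group proportions
    G = np.eye(J)[groups]                          # N x J one-hot group matrix
    S = rng.dirichlet(np.ones(K), size=N)          # N x K soft assignments
    eps = 1e-12
    for _ in range(n_iter):
        mass = np.maximum(S.sum(0), eps)                       # cluster masses, K
        C = (S.T @ X) / mass[:, None]                          # prototypes, K x D
        d = ((X[:, None, :] - C[None]) ** 2).sum(-1)           # distances, N x K
        p = (S.T @ G) / mass[:, None]                          # cluster mixes, K x J
        # Exact gradient of sum_k KL(U || p_k) w.r.t. s_nk:
        #   (1 - U_{j(n)} / p_{k, j(n)}) / mass_k
        ratio = U[None, :] / np.maximum(p, eps)                # K x J
        grad = (1.0 - G @ ratio.T) / mass[None, :]             # N x K
        # Independent per-point softmax step on the linearized objective
        # (the paper instead minimizes a tight concave-convex bound).
        logits = -(d + lam * grad)
        logits -= logits.max(1, keepdims=True)                 # numerical stability
        S = np.exp(logits)
        S /= S.sum(1, keepdims=True)
    return S, C
```

Raising `lam` pushes every cluster's group mix toward the dataset-wide proportions `U`; setting `lam=0` recovers a soft K-means, which is the trade-off control the abstract refers to.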


Similar articles

The Variational Fair Autoencoder

We investigate the problem of learning representations that are invariant to certain nuisance or sensitive factors of variation in the data while retaining as much of the remaining information as possible. Our model is based on a variational autoencoding architecture (Kingma & Welling, 2014; Rezende et al., 2014) with priors that encourage independence between sensitive and latent factors of va...


Fair Clustering Through Fairlets

We study the question of fair clustering under the disparate impact doctrine, where each protected class must have approximately equal representation in every cluster. We formulate the fair clustering problem under both the k-center and the k-median objectives, and show that even with two protected classes the problem is challenging, as the optimum solution can violate common conventions—for in...


Variational Bayesian speaker clustering

In this paper we explore the use of Variational Bayesian (VB) learning in unsupervised speaker clustering. VB learning is a relatively new learning technique that has the capacity of doing at the same time parameter learning and model selection. We tested this approach on the NIST 1996 HUB-4 evaluation test for speaker clustering when the speaker number is a priori known and when it has to be e...


Exotic hadrons from quark clustering at FAIR

The CBM experiment at FAIR will probe nuclear matter at high densities and comparatively low temperatures, giving access to a region of the phase diagram of QCD not yet studied in detail. One expects to find highly compressed hadronic matter and, at higher energies, deconfined quark matter, possibly subject to strong correlations. These conditions could be favourable to the production of exotic...


Variational Inference for Nonparametric Multiple Clustering

Most clustering algorithms produce a single clustering solution. Similarly, feature selection for clustering tries to find one feature subset where one interesting clustering solution resides. However, a single data set may be multi-faceted and can be grouped and interpreted in many different ways, especially for high dimensional data, where feature selection is typically needed. Moreover, diff...



Journal

Journal title: Proceedings of the ... AAAI Conference on Artificial Intelligence

Year: 2021

ISSN: 2159-5399, 2374-3468

DOI: https://doi.org/10.1609/aaai.v35i12.17336